ZEPPELIN-3552. Support Scala 2.12 of SparkInterpreter #3034
Conversation
hmm, Spark doesn't support Scala 2.12?

The first Spark version to support Scala 2.12 is 2.4.0, IIUC.
felixcheung left a comment
Given that there are a few unresolved subtasks that no one is working on at the moment, I'm not sure if this would be resolved by Spark 2.4.0 release. So my thought would be not yet for now.
Spark 2.4 will officially support Scala 2.12, so it would be great if Zeppelin supported it together with Spark. Also, there are some libraries that are Scala 2.12 only.

We will try to support it after the Spark 2.4 release.
sparkILoop.in = reader
sparkILoop.initializeSynchronous()
callMethod(sparkILoop, "scala$tools$nsc$interpreter$ILoop$$loopPostInit")
Hmmm .. weird. From my testing against Spark 2.4.0 RC, this failed to find the method because it became a nested function (https://github.com/scala/scala/blob/v2.12.6/src/repl/scala/tools/nsc/interpreter/ILoop.scala#L993). Let me try to manually test against this one.
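For context, here is a minimal sketch of what a reflective helper like callMethod could look like; this is an assumed shape, not the PR's exact code. Under Scala 2.11 the private ILoop.loopPostInit is compiled to a JVM method with the mangled name used above, so the lookup succeeds; under Scala 2.12.x loopPostInit became a local function inside ILoop.process, so any lookup by that name fails.

import java.lang.reflect.Method

object ReflectUtil {
  // Find a zero-argument method by (possibly name-mangled) name and invoke it.
  // Throws NoSuchMethodException if the method does not exist on the target,
  // which is exactly what happens for loopPostInit under Scala 2.12.x.
  def callMethod(target: AnyRef, name: String): AnyRef = {
    val method: Method = target.getClass.getMethods
      .find(_.getName == name)
      .getOrElse(throw new NoSuchMethodException(name))
    method.setAccessible(true)
    method.invoke(target)
  }
}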
import scala.tools.nsc.interpreter._
/**
 * SparkInterpreter for scala-2.11
scala-2.11 -> scala-2.12 :-)
The default distribution will still be with Scala 2.11 for Spark 2.4, if I am not mistaken. It is nice to support it, but Spark 2.4 with 2.11 should be supported first as a higher priority. I can work further on 2.4.0 with 2.12 support after this one gets merged.
Hey Jeff, I can take over this one too if you're busy, since I took a look at similar code paths.
Hi @felixcheung @zjffdu, any news on this? Spark 3.0 (which will be out late summer) seems to remove support for 2.11, so it would be great to merge this ASAP.
Sorry, actually I am stuck with some work .. I think I won't be able to take over this.
Don't worry @HyukjinKwon, let me continue this.
Force-pushed from 03d3122 to 657a443.
Force-pushed from 7581580 to 7feee7b.
<scala.version>2.10.5</scala.version>
<scala.binary.version>2.10</scala.binary.version>
- <scalatest.version>2.2.4</scalatest.version>
+ <scalatest.version>3.0.7</scalatest.version>
3.0.7 supports Scala 2.10, 2.11, and 2.12.
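As a quick illustration that ScalaTest 3.0.x is cross-published for all three binary versions, a minimal suite like the one below compiles unchanged on Scala 2.10, 2.11, and 2.12 against the scalatest 3.0.7 artifacts (the suite name and assertion are illustrative only):

import org.scalatest.FunSuite

class CrossVersionSuite extends FunSuite {
  test("runs on any supported Scala binary version") {
    // versionNumberString is e.g. "2.12.8"; keep only the binary version.
    val binary = scala.util.Properties.versionNumberString.split("\\.").take(2).mkString(".")
    assert(Set("2.10", "2.11", "2.12").contains(binary))
  }
}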
whoa, long time

Scala 2.12 is no longer experimental in Spark 2.4.1, so I suggest targeting 2.4.1.

Any ETA for this?

@conker84 I made some updates to this PR, and it basically works in my local environment. But I still need more time to refine it. I suppose this could be done in 0.9.

Ok, thanks!
Force-pushed from 608046f to ddcf7b7.
Any news on this?

Ping

@conker84 I have been busy with other stuff recently, but have made some progress on this. The plan is still to complete it before 0.9, which is planned to be released this summer.
Force-pushed from 56e4c56 to ef2c6af.
Folks, this PR is ready for review. Scala 2.12 is now supported.

LGTM
HyukjinKwon left a comment
LGTM too
Will merge if there are no more comments.
What is this PR for?
This PR adds support for Scala 2.12 to SparkInterpreter. In this PR, I did some refactoring of the whole spark module. Each Scala version's interpreter will be loaded dynamically via URLClassLoader, so that we can write the code once, compile it multiple times for different Scala versions, and load the right build based on the current Scala version.
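As a rough sketch of that loading scheme (the jar path and class name below are hypothetical placeholders, not the PR's actual layout), the idea is to detect the running Scala binary version and load the matching interpreter build through a URLClassLoader:

import java.net.{URL, URLClassLoader}

object VersionedInterpreterLoader {
  // Binary version of the Scala runtime in use, e.g. "2.11" or "2.12".
  private val scalaBinaryVersion: String =
    scala.util.Properties.versionNumberString.split("\\.").take(2).mkString(".")

  def loadInterpreter(): AnyRef = {
    // Hypothetical layout: one jar per supported Scala version, built from the same sources.
    val jar = new URL(s"file:///opt/zeppelin/interpreter/spark/scala-$scalaBinaryVersion/interpreter.jar")
    val loader = new URLClassLoader(Array(jar), getClass.getClassLoader)
    // Placeholder class name; each per-version jar would provide this entry point.
    val clazz = loader.loadClass("org.example.SparkScalaInterpreter")
    clazz.getDeclaredConstructor().newInstance().asInstanceOf[AnyRef]
  }
}

This keeps a single copy of the interpreter sources while the build produces one artifact per Scala version.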
What type of PR is it?
[Feature | Refactoring]
Todos
What is the Jira issue?
ZEPPELIN-3552
How should this be tested?
Screenshots (if appropriate)
Questions: